Natural Language Inference, Sentence Representation and Attention Mechanism

Author

  • Cyprien de Lichy
Abstract

A characteristic of natural language is that there are many different ways to express a statement: several meanings can be contained in a single text, and the same meaning can be conveyed by different texts. Understanding entailment thus becomes crucial to understanding natural language. Natural language inference constitutes an effective way to evaluate machine reading algorithms. It is an interesting problem because it reflects our ability to extract and represent meaning at the sentence level, and it takes into account several characteristics of natural language such as scope, syntactic structure, lexical ambiguity, and semantic compositionality.


Related papers

Learning Natural Language Inference using Bidirectional LSTM model and Inner-Attention

In this paper, we propose a sentence encoding-based model for recognizing text entailment. In our approach, the encoding of a sentence is a two-stage process. First, average pooling is applied over word-level bidirectional LSTM (biLSTM) outputs to generate a first-stage sentence representation. Second, an attention mechanism replaces average pooling on the same sentence for a better represent...
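The two-stage encoding described above can be sketched in plain NumPy, assuming the biLSTM hidden states are already computed. The matrix `H` and the choice of the first-stage vector as the attention query are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

def average_pool(hidden_states):
    # Stage 1: mean over time steps of the biLSTM outputs, shape (T, d) -> (d,)
    return hidden_states.mean(axis=0)

def attention_pool(hidden_states, query):
    # Stage 2: replace average pooling with an attention-weighted sum
    scores = hidden_states @ query                 # one score per time step, (T,)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                       # softmax over time steps
    return weights @ hidden_states                 # weighted sum, (d,)

# Toy example: 4 time steps, hidden size 3 (hypothetical values)
rng = np.random.default_rng(0)
H = rng.normal(size=(4, 3))
first_stage = average_pool(H)
# Use the first-stage vector as the attention query (an assumption for illustration)
second_stage = attention_pool(H, first_stage)
```

The point of the second stage is that informative time steps receive larger softmax weights, whereas average pooling treats every word equally.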


Syntax-based Attention Model for Natural Language Inference

Introducing an attentional mechanism into neural networks is a powerful concept, and it has achieved impressive results in many natural language processing tasks. However, most existing models impose the attentional distribution on a flat topology, namely the entire input representation sequence. Clearly, any well-formed sentence has an accompanying syntactic tree structure, which is a much rich top...


Recurrent Neural Network-Based Sentence Encoder with Gated Attention for Natural Language Inference

The RepEval 2017 Shared Task aims to evaluate natural language understanding models for sentence representation, in which a sentence is represented as a fixed-length vector with neural networks and the quality of the representation is tested with a natural language inference task. This paper describes our system (alpha), which is ranked among the top in the Shared Task, on both the in-domain test ...


DiSAN: Directional Self-Attention Network for RNN/CNN-free Language Understanding

Recurrent neural nets (RNNs) and convolutional neural nets (CNNs) are widely used in NLP tasks to capture long-term and local dependencies, respectively. Attention mechanisms have recently attracted enormous interest due to their highly parallelizable computation, significantly shorter training time, and flexibility in modeling dependencies. We propose a novel attention mechanism in which the atte...
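The directional masking idea behind this line of work can be illustrated with a minimal NumPy sketch. This is only the masking concept under simplifying assumptions (raw dot-product scores, a single head, no learned projections), not DiSAN's actual multi-dimensional formulation:

```python
import numpy as np

def directional_self_attention(X, forward=True):
    # Each token attends only to itself and tokens in one direction.
    # X: token representations, shape (T, d).
    T = X.shape[0]
    scores = X @ X.T                                   # pairwise token scores, (T, T)
    keep = np.tril(np.ones((T, T), bool)) if forward else np.triu(np.ones((T, T), bool))
    scores = np.where(keep, scores, -np.inf)           # mask out the forbidden direction
    w = np.exp(scores - scores.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)                  # row-wise softmax
    return w @ X                                       # context vectors, (T, d)

# Toy example: 5 tokens, dimension 4 (hypothetical values)
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 4))
fwd = directional_self_attention(X, forward=True)
bwd = directional_self_attention(X, forward=False)
```

Because the first token in the forward pass (and the last token in the backward pass) can attend only to itself, its context vector equals its own input, which makes the masking easy to sanity-check.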


Evaluating Multimodal Representations on Sentence Similarity: vSTS, Visual Semantic Textual Similarity Dataset

The success of word representations (embeddings) learned from text has motivated analogous methods for learning representations of longer sequences of text such as sentences, a fundamental step in any task requiring some level of text understanding [13]. Sentence representation is a challenging task that has to consider aspects such as compositionality, phrase similarity, negation, etc. In order to...




Journal:

Volume   Issue 

Pages  -

Publication date: 2016